Stable recovery of deep linear networks under sparsity constraints
Authors
Abstract
We study a deep linear network expressed as a matrix factorization problem. It takes as input a matrix X obtained by multiplying K matrices (called factors, each corresponding to the action of a layer). Each factor is obtained by applying a fixed linear operator to a vector of parameters satisfying a sparsity constraint. In machine learning, the error between the product of the estimated factors and X (i.e. the reconstruction error) relates to the statistical risk. The stable recovery of the parameters defining the factors is required in order to interpret the factors and the intermediate layers of the network. In this paper, we provide sharp conditions on the network topology under which the error on the parameters defining the factors (i.e. the stability of the recovered parameters) scales linearly with the reconstruction error (i.e. the risk). Therefore, under these conditions on the network topology, any successful learning task leads to robust and therefore interpretable layers. The analysis is based on the recently proposed Tensorial Lifting. The particularity of this paper is to consider a sparse prior. As an illustration, we detail the analysis and provide sharp guarantees for the stable recovery of convolutional linear networks under a sparsity prior. As expected, the conditions are rather strong.
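For concreteness, the following numpy sketch instantiates the model described in the abstract: K sparse parameter vectors are mapped to factors by fixed linear operators, their product gives X, and the reconstruction error is compared to the error on the parameters. All dimensions, operators, and the perturbation used for the comparison are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes only (not from the paper): depth K, factor size n x n,
# p parameters per layer, each parameter vector s-sparse.
K, n, p, s = 3, 8, 20, 4

# Fixed linear operators M_k mapping R^p to n x n matrices, drawn at random here.
M = [rng.standard_normal((n * n, p)) for _ in range(K)]

def factor(M_k, theta_k):
    # Apply the fixed linear operator to the parameter vector to get one factor.
    return (M_k @ theta_k).reshape(n, n)

def sparse_params():
    # Draw an s-sparse parameter vector in R^p.
    theta = np.zeros(p)
    support = rng.choice(p, size=s, replace=False)
    theta[support] = rng.standard_normal(s)
    return theta

# Ground-truth parameters and the observed product X = X_1 X_2 ... X_K.
theta_true = [sparse_params() for _ in range(K)]
X = np.linalg.multi_dot([factor(M[k], theta_true[k]) for k in range(K)])

# A candidate estimate (here, a small perturbation of the truth with the same
# supports) and the two quantities whose relation the paper studies.
theta_hat = [t + 1e-3 * rng.standard_normal(p) * (t != 0) for t in theta_true]
X_hat = np.linalg.multi_dot([factor(M[k], theta_hat[k]) for k in range(K)])

reconstruction_error = np.linalg.norm(X_hat - X)          # relates to the risk
parameter_error = sum(np.linalg.norm(h - t) for h, t in zip(theta_hat, theta_true))
print(reconstruction_error, parameter_error)
```

Under the topology conditions established in the paper, the second printed quantity is bounded by a constant times the first; the sketch only computes the two quantities, it does not check those conditions.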
Related papers
A Sharp Sufficient Condition for Sparsity Pattern Recovery
The sufficient number of linear, noisy measurements for exact and approximate sparsity pattern/support set recovery in the high-dimensional setting is derived. Although this problem has been addressed in the recent literature, there are still considerable gaps between those results and the exact limits of perfect support set recovery. To reduce this gap, in this paper, the sufficient con...
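As a toy illustration of the support-recovery problem this entry addresses (not of the specific sufficient condition it derives), the sketch below estimates the support of a sparse vector from noisy linear measurements with a plain orthogonal matching pursuit; all sizes and the noise level are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy support-recovery instance: y = A x + noise, with x s-sparse (sizes arbitrary).
n, m, s, sigma = 256, 80, 5, 0.01
A = rng.standard_normal((m, n)) / np.sqrt(m)
x = np.zeros(n)
true_support = rng.choice(n, size=s, replace=False)
x[true_support] = rng.choice([-1.0, 1.0], size=s) * (1.0 + rng.random(s))  # entries bounded away from 0
y = A @ x + sigma * rng.standard_normal(m)

# Plain orthogonal matching pursuit: greedily pick the column most correlated
# with the residual, then re-fit by least squares on the selected columns.
support, residual = [], y.copy()
for _ in range(s):
    support.append(int(np.argmax(np.abs(A.T @ residual))))
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    residual = y - A[:, support] @ coef

print("exact support recovery:", set(support) == set(true_support.tolist()))
```

Whether the printed check succeeds depends on the number of measurements, the sparsity, and the noise level, which is exactly the trade-off that sufficient conditions of this kind quantify.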
Blind Compressed Sensing
The fundamental principle underlying compressed sensing is that a signal, which is sparse under some basis representation, can be recovered from a small number of linear measurements. However, prior knowledge of the sparsity basis is essential for the recovery process. This work introduces the concept of blind compressed sensing, which avoids the need to know the sparsity basis in both the samp...
Fast Algorithms for Sparse Recovery with Perturbed Dictionary
In this paper, we consider approaches to sparse recovery from large underdetermined linear models with perturbations present in both the measurements and the dictionary matrix. Existing methods have high computational cost and low efficiency. The total least-squares (TLS) criterion has well-documented merits in solving linear regression problems, while the FOCal Underdetermined System Solver (FOCUSS) has ...
The high order block RIP condition for signal recovery
In this paper, we consider the recovery of block sparse signals, whose nonzero entries appear in blocks (or clusters) rather than being spread arbitrarily throughout the signal, from incomplete linear measurements. A high order sufficient condition based on the block RIP is obtained that guarantees the stable recovery of all block sparse signals in the presence of noise, and robust recovery when signals are ...
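For intuition about the block-sparse setting (not the block-RIP analysis of the cited paper), here is a small numpy sketch that recovers the active blocks with a basic block orthogonal matching pursuit; block sizes, measurement count, and noise level are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy block-sparse instance: the signal is split into blocks of length d and
# only k of those blocks are nonzero (all sizes are arbitrary).
n_blocks, d, k, m, sigma = 32, 4, 3, 60, 0.01
n = n_blocks * d
A = rng.standard_normal((m, n)) / np.sqrt(m)

x = np.zeros(n)
active_blocks = rng.choice(n_blocks, size=k, replace=False)
for b in active_blocks:
    x[b * d:(b + 1) * d] = rng.standard_normal(d)
y = A @ x + sigma * rng.standard_normal(m)

# Basic block OMP: pick the block whose columns correlate most with the
# residual (l2 norm over the block), then least-squares re-fit on the union
# of the selected blocks.
selected, residual = [], y.copy()
for _ in range(k):
    corr = A.T @ residual
    scores = [np.linalg.norm(corr[b * d:(b + 1) * d]) for b in range(n_blocks)]
    selected.append(int(np.argmax(scores)))
    cols = np.concatenate([np.arange(b * d, (b + 1) * d) for b in selected])
    coef, *_ = np.linalg.lstsq(A[:, cols], y, rcond=None)
    residual = y - A[:, cols] @ coef

print("recovered blocks:", sorted(selected), "true blocks:", sorted(active_blocks.tolist()))
```

Selecting whole blocks rather than individual coordinates is what distinguishes this from plain sparse recovery and is the structure that block-RIP conditions are tailored to.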
A Unified Framework for Identifiability Analysis in Bilinear Inverse Problems with Applications to Subspace and Sparsity Models
Bilinear inverse problems (BIPs), the recovery of two vectors from their image under a bilinear mapping, arise in many applications. Without further constraints, BIPs are usually ill-posed. In practice, properties of natural signals are exploited to solve BIPs. For example, subspace constraints or sparsity constraints are imposed to reduce the search space. These approaches have shown some s...